Designing a Belief Function-Based Accessibility Indicator to Improve Web Browsing for Disabled People
The purpose of this study is to provide an accessibility measure of
web pages, in order to draw disabled users to the pages that have been designed
to be accessible to them. Our approach is based on the theory of belief
functions, using data supplied by reports produced by automatic web
content assessors that test the validity of criteria defined by the WCAG 2.0
guidelines proposed by the World Wide Web Consortium (W3C). These
tools detect errors with gradual degrees of certainty, and their results do not
always converge. For these reasons, to fuse the information coming from the
reports, we chose to use an information fusion framework which can take into
account the uncertainty and imprecision of information as well as divergences
between sources. Our accessibility indicator covers four categories of
deficiencies. To validate the theoretical approach in this context, we propose
an evaluation carried out on a corpus of the 100 most visited French news websites,
using 2 evaluation tools. The results obtained illustrate the interest of our
accessibility indicator.
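The fusion step this abstract describes can be illustrated with Dempster's rule of combination, the standard operation for merging mass functions in belief-function theory. The sketch below is not the paper's implementation; the two-element frame ({Accessible, Not accessible}) and the mass values are illustrative assumptions, chosen to show how conflicting assessor reports are reconciled.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass)
    with Dempster's rule, renormalizing away conflicting mass."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        w = wa * wb
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w
        else:
            conflict += w  # mass on the empty set: the sources disagree
    if conflict >= 1.0:
        raise ValueError("sources are totally conflicting")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Illustrative frame: a page is Accessible (A) or Not (N); each assessor
# may leave some mass on ignorance {A, N} (gradual certainty).
A, N = frozenset({"A"}), frozenset({"N"})
AN = frozenset({"A", "N"})
m1 = {A: 0.6, AN: 0.4}          # assessor 1: fairly confident accessible
m2 = {A: 0.3, N: 0.5, AN: 0.2}  # assessor 2: leans not accessible
fused = dempster_combine(m1, m2)
# fused masses: A = 0.60, N ~ 0.29, {A, N} ~ 0.11
```

Note how the 0.30 of directly conflicting mass (assessor 1 on A, assessor 2 on N) is discarded and the remainder renormalized, which is how divergent assessor reports can still yield a single indicator.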
Evaluating Conformance to WCAG 2.0: Open Challenges
Web accessibility for people with disabilities is a highly visible area of work in the field of ICT accessibility, including many policy activities in several countries. The commonly accepted guidelines for web accessibility (WCAG 1.0) were published in 1999 and have been extensively used by designers, evaluators and legislators. A new version of these guidelines (WCAG 2.0) was published in 2008. In this paper we point out the main challenges that WCAG 2.0 raises for web accessibility evaluators: the concept of "accessibility supported technologies"; success criteria testability; technique and failure openness; and the aggregation of partial results. We conclude the paper with some recommendations for the future.
Subjective Stimulation in the Development of the Postural Reflex Mechanism in a Child with Cerebral Palsy
It is widely accepted that the consistency of adaptive interfaces is crucial for their usability. Many threats to consistency in adaptive applications have been reported in the literature so far (e.g., consistency of adaptation methods and techniques, consistency of the user model). In this paper we argue that many, if not all, user modeling systems that have been developed so far substantially threaten consistency by offering no adequate means for communicating consistency contexts. This is especially the case for user modeling servers, which are supposed to serve several applications in parallel. In order to prevent consistency problems in user modeling systems, we introduce basic concepts and techniques from transaction management. User modeling systems that adhere to the principles of transaction management can be expected to provide a reliable source of user information for adaptive applications, especially in real-world settings.
Usability and digital inclusion: standards and guidelines
This article aims to discuss e-government website usability in relation to concerns about
digital inclusion. E-government web design should consider all aspects of usability, including
those that make it more accessible to all. Traditional concerns about social exclusion are being
superseded by fears that lack of digital competence and information literacy may result in dangerous
digital exclusion. Usability is considered a way to address this exclusion and should
therefore incorporate inclusion and accessibility guidelines. This article makes an explicit link
between usability guidelines and digital inclusion and reports on a survey of local government
web presence in Portugal.
The EU-ToxRisk method documentation, data processing and chemical testing pipeline for the regulatory use of new approach methods
Hazard assessment, based on new approach methods (NAM), requires the use of batteries of assays, where individual tests may be contributed by different laboratories. A unified strategy for such collaborative testing is presented. It details all procedures required to allow test information to be usable for integrated hazard assessment, strategic project decisions and/or regulatory purposes. The EU-ToxRisk project developed a strategy to provide regulatorily valid data, and exemplified this using a panel of > 20 assays (with > 50 individual endpoints), each exposed to 19 well-known test compounds (e.g. rotenone, colchicine, mercury, paracetamol, rifampicin, paraquat, taxol). Examples of strategy implementation are provided for all aspects required to ensure data validity: (i) documentation of test methods in a publicly accessible database; (ii) deposition of standard operating procedures (SOP) at the European Union DB-ALM repository; (iii) test readiness scoring according to defined criteria; (iv) disclosure of the pipeline for data processing; (v) linking of uncertainty measures and metadata to the data; (vi) definition of test chemicals, their handling and their behavior in test media; (vii) specification of the test purpose and overall evaluation plans. Moreover, data generation was exemplified by providing results from 25 reporter assays. A complete evaluation of the entire test battery will be described elsewhere. A major learning from the retrospective analysis of this large testing project was the need for thorough definitions of the above strategy aspects, ideally in the form of a study pre-registration, to allow adequate interpretation of the data and to ensure overall scientific/toxicological validity.